Web Survey Bibliography
Relevance & Research Question:
Pre-testing is a widely accepted method of trialling a survey instrument before fielding it to the full sample. Pre-testing an online survey offers several advantages, e.g., collecting early feedback on survey responses or uncovering technical problems, to name only a few.
This study addresses the following questions:
(1) Is pre-testing used before fielding the online survey instrument to the full sample?
(2) Does the usage frequency of online survey tools influence whether a pre-test is conducted?
(3) Are there differences in usage behavior across sectors (e.g., government, academic, non-profit, or for-profit)?
(4) How large is the pre-test sample?
Methods & Data:
Data were collected via a web survey among LimeSurvey users between July 2009 and October 2011. Of 40,663 respondents, 14,622 answered the question of whether they had ever run a pre-test. Analyses were conducted using descriptive statistics, cross-tabulations, and related statistical tests.
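The kind of cross-tabulation test mentioned above can be sketched as follows. This is a minimal illustration of a Pearson chi-square test on a 2x2 contingency table; the counts below are hypothetical placeholders, not the study's data, and the helper function is our own, not part of the study's analysis.

```python
def chi_square(table):
    """Pearson chi-square statistic for a 2D contingency table."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # Expected count under independence of rows and columns
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical counts: rows = user sector, columns = (pre-tested, did not)
table = [[629, 371],   # e.g. academic users
         [583, 417]]   # e.g. governmental users
stat = chi_square(table)
# df = (rows - 1) * (cols - 1) = 1; critical value at alpha = 0.05 is 3.841
print(f"chi2 = {stat:.2f}, significant at 5%: {stat > 3.841}")
```

In practice such a test would be run with a statistics package (e.g. `scipy.stats.chi2_contingency`), which also returns the p-value directly.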
Results:
The core result is that pre-testing online surveys before fielding them to the full sample is now an established practice:
(1) Around 65.9% of online survey tool users occasionally or always run a trial survey (pre-test) before fielding it to the full sample.
(2) There is a significant U-shaped relationship between the usage frequency of online survey tools and conducting a pre-test (…).
(3) The share of users conducting pre-tests is highest among academic users (62.9%) and lowest among governmental users (58.3%).
(4) Around 64.5% of online survey tool users conducted a pre-test with a sample size of between 1 and 100 cases.
Added Value:
The results of this study provide insights into pre-testing among online survey tool users. Although nearly two-thirds of online survey tool users run pre-tests with between 1 and 100 participants, the remaining third could still improve the quality of their online surveys through pre-testing. Overall, this should lead to higher acceptance of online surveys.
GOR Homepage (abstract)